Disengaging attention sets the temporal limit of attentive tracking

Authors

  • Jeroen S. Benjamins
  • Ignace T.C. Hooge
  • Maarten J. van der Smagt
  • Frans A.J. Verstraten
Abstract

At first sight, recent studies investigating the temporal limits of attentive tracking show contradictory outcomes. Attentively tracking an object in an ambiguous apparent motion display can have an upper limit of around 0.4 revolutions per second (rps) [Horowitz, T. S., Holcombe, A. O., Wolfe, J. M., Arsenio, H. C., & DiMase, J. S. (2004). Attentional pursuit is faster than attentional saccade. Journal of Vision, 4, 585-603] or 1 rps [Verstraten, F. A., Cavanagh, P., & Labianca, A. T. (2000). Limits of attentive tracking reveal temporal properties of attention. Vision Research, 40, 3651-3664]. Here, we demonstrate that this difference depends on presentation conditions: an important determinant of the temporal limit of attentive tracking appears to be the duty cycle. Tracking performance at high(er) rates decreases to chance with increasing duty cycle, whereas at low rates duty cycle has hardly any effect on performance. Results are discussed in terms of (dis)engagement of attention.
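
To make the duty-cycle manipulation concrete, the sketch below is a hypothetical illustration (not the authors' stimulus code; the function name, step count, and parameter names are assumptions) of how the on- and blank-times of each display frame follow from a rotation rate and a duty cycle.

```python
# Hypothetical sketch of the timing of a stepped apparent-motion display.
# duty_cycle = on_time / (on_time + blank_time) for each step; the abstract's
# claim is that tracking at higher rotation rates collapses as duty cycle grows.

def frame_timing(rotation_rate_rps: float,
                 steps_per_revolution: int,
                 duty_cycle: float) -> tuple[float, float]:
    """Return (on_ms, blank_ms) for one step of the display."""
    cycle_ms = 1000.0 / (rotation_rate_rps * steps_per_revolution)
    on_ms = duty_cycle * cycle_ms
    return on_ms, cycle_ms - on_ms

# Example: at 1 rps with 8 steps per revolution, a 0.5 duty cycle gives
# 62.5 ms on / 62.5 ms blank per step; a duty cycle of 1.0 leaves no blank.
print(frame_timing(1.0, 8, 0.5))   # (62.5, 62.5)
print(frame_timing(0.4, 8, 1.0))   # (312.5, 0.0)
```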

Similar articles

Limits of attentive tracking reveal temporal properties of attention

The maximum speed for attentive tracking of targets was measured in three types of (radial) motion displays: ambiguous motion where only attentive tracking produced an impression of direction, apparent motion, and continuous motion. The upper limit for tracking (about 50 deg s⁻¹) was an order of magnitude lower than the maximum speed at which motion can be perceived for some of these stimuli. I...
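
The ~50 deg s⁻¹ limit here and the ~1 rps limit in the parent abstract can be related through display geometry. The snippet below is a hypothetical illustration only (the 8 deg eccentricity is an assumption, not a value taken from either paper) of how a rotation rate maps onto the retinal speed of a tracked element.

```python
import math

# Hypothetical conversion: tangential speed (deg/s) of an element moving on a
# circular path of radius `eccentricity_deg`, rotating at `rotation_rate_rps`.
def retinal_speed_deg_per_s(rotation_rate_rps: float, eccentricity_deg: float) -> float:
    return 2 * math.pi * eccentricity_deg * rotation_rate_rps

# At an assumed 8 deg eccentricity, a 1 rps tracking limit corresponds to
# roughly 50 deg/s, the same order of magnitude as the speed limit quoted above.
print(round(retinal_speed_deg_per_s(1.0, 8.0), 1))  # 50.3
```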

Cortical Circuit for Binding Object Identity and Location During Multiple-Object Tracking.

Sustained multifocal attention for moving targets requires binding object identities with their locations. The brain mechanisms of identity-location binding during attentive tracking have remained unresolved. In 2 functional magnetic resonance imaging experiments, we measured participants' hemodynamic activity during attentive tracking of multiple objects with equivalent (multiple-object tracki...

Chronic Pain and Selective Attention to Pain Arousing Daily Activity Pictures: Evidence From an Eye Tracking Study

Introduction: According to the pain research literature, attentional bias for pain is the mechanism responsible for the development and maintenance of fear of pain in patients with chronic pain. However, there is still some debate about the exact mechanism and the role of faster engagement versus difficulty in disengagement in the development of attentional bias.  Methods: To investigate ...

Phoneme Classification Using Temporal Tracking of Speech Clusters in Spectro-temporal Domain

This article presents a new feature extraction technique based on the temporal tracking of clusters in the spectro-temporal feature space. In the proposed method, auditory cortical outputs were clustered. The attributes of speech clusters were extracted as secondary features. However, the shape and position of speech clusters change over time. The clusters were temporally tracked and temporal tra...

Cortical fMRI activation produced by attentive tracking of moving targets.

Attention can be used to keep track of moving items, particularly when there are multiple targets of interest that cannot all be followed with eye movements. Functional magnetic resonance imaging (fMRI) was used to investigate cortical regions involved in attentive tracking. Cortical flattening techniques facilitated within-subject comparisons of activation produced by attentive tracking, visua...


Journal:
  • Vision Research

Volume 47, Issue

Pages -

Publication date: 2007